Tree Based Orthogonal Least Squares Regression with Repeated Weighted Boosting Search
Authors
Abstract
Orthogonal least squares regression (OLSR) that selects each regressor by repeated weighted boosting search (RWBS) is known to produce a much sparser model than many other kernel methods. With the aid of a tree-structured search, this paper constructs an even sparser regression model in the framework of OLSR with RWBS. When RWBS is used to solve the optimization at each regression stage, OLSR is extended by keeping the k (k > 1) best regressors, those that most reduce the modeling MSE, rather than only the single best one at each iteration. In this way, the next regressor is searched for in k subspaces instead of the single subspace of the conventional method. Furthermore, we propose a subtree search that reduces the empirical time complexity by bounding the total number of children at every tree depth. The new schemes are shown to outperform the traditional method in applications such as component detection, sparse representation of ECG signals, and 2-D time-series modeling. Experimental results also indicate that the subtree-based algorithm has much lower time complexity than the tree-based one.
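The tree-structured extension described above amounts to a beam search over regressor subsets: at each stage, every surviving subset is extended by candidate regressors, and only the k extensions with the lowest modeling MSE survive. The sketch below illustrates this selection scheme under a simplifying assumption: regressors are chosen from a fixed pool of candidate columns (the paper instead tunes each regressor's parameters continuously via RWBS, which is not reproduced here). The function name `beam_ols_select` and its parameters are illustrative, not from the paper.

```python
import numpy as np

def beam_ols_select(X, y, n_terms, beam_width):
    """Greedy least-squares regressor selection with a beam of width k.

    At each stage, every surviving subset of columns of X is extended
    by each unused column; the beam_width extensions with the lowest
    residual MSE survive to the next stage. beam_width = 1 recovers
    the conventional one-best-regressor-per-stage OLSR.
    """
    n_candidates = X.shape[1]
    beam = [((), np.inf)]  # list of (selected column indices, MSE)
    for _ in range(n_terms):
        extensions = []
        for subset, _ in beam:
            for j in range(n_candidates):
                if j in subset:
                    continue
                cols = subset + (j,)
                A = X[:, cols]
                # Least-squares fit on the extended subset.
                coef, *_ = np.linalg.lstsq(A, y, rcond=None)
                mse = np.mean((y - A @ coef) ** 2)
                extensions.append((cols, mse))
        # Keep the beam_width best extensions, deduplicating subsets
        # that differ only in selection order.
        seen, beam = set(), []
        for cols, mse in sorted(extensions, key=lambda e: e[1]):
            key = frozenset(cols)
            if key not in seen:
                seen.add(key)
                beam.append((cols, mse))
            if len(beam) == beam_width:
                break
    return beam[0]  # best subset found and its MSE
```

With beam_width > 1, a regressor that looks slightly suboptimal at one stage can still survive and enable a better combination later, which is the mechanism the paper exploits to obtain sparser models; the subtree variant additionally caps the number of children expanded per depth to keep the search cheap.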
Similar resources
An approach for constructing parsimonious generalized Gaussian kernel regression models
The paper proposes a novel construction algorithm for generalized Gaussian kernel regression models. Each kernel regressor in the generalized Gaussian kernel regression model has an individual diagonal covariance matrix, which is determined by maximizing the correlation between the training data and the regressor using a repeated guided random search based on boosting optimization. The standard...
Boosting Weighted Partial Least Squares for Batch Process Quality Prediction
In batch processes, end-product qualities are cumulatively determined by variable dynamic trajectories throughout each batch. Meanwhile, batch processes are inherently time-varying, implying that process variables may have different impacts on end-qualities at different time intervals. To take both the cumulative and the time-varying effects into better consideration for quality prediction, a b...
OPTIMAL SHAPE DESIGN OF GRAVITY DAMS BASED ON A HYBRID META-HEURISTIC METHOD AND WEIGHTED LEAST SQUARES SUPPORT VECTOR MACHINE
A hybrid meta-heuristic optimization method is introduced to efficiently find the optimal shape of concrete gravity dams including dam-water-foundation rock interaction subjected to earthquake loading. The hybrid meta-heuristic optimization method is based on a hybrid of gravitational search algorithm (GSA) and particle swarm optimization (PSO), which is called GSA-PSO. The operation of GSA-PSO...
High-Dimensional $L_2$Boosting: Rate of Convergence
Boosting is one of the most significant developments in machine learning. This paper studies the rate of convergence of L2Boosting, which is tailored for regression, in a high-dimensional setting. Moreover, we introduce so-called “post-Boosting”. This is a post-selection estimator which applies ordinary least squares to the variables selected in the first stage by L2Boosting. Another variant is...
Model-based Boosting 2.0
This is an extended version of the manuscript Torsten Hothorn, Peter Bühlmann, Thomas Kneib, Matthias Schmid and Benjamin Hofner (2010), Model-based Boosting 2.0. Journal of Machine Learning Research, 11, 2109 – 2113; http://jmlr.csail.mit.edu/papers/v11/hothorn10a.html. We describe version 2.0 of the R add-on package mboost. The package implements boosting for optimizing general risk function...
Journal title:
- JCP
Volume 7, Issue
Pages -
Publication date: 2012